
    The application of neural network data mining algorithm into mixed pixel classification in geographic information system environment

    With the rapid growth of satellite technology and increasing spatial resolution, hyperspectral imaging sensors are frequently used for research and development as well as in some semi-operational scenarios. Hyperspectral images also offer unique applications such as terrain delimitation, object detection, material identification, and atmospheric characterization. However, hyperspectral imaging systems produce large data sets that are not easily interpretable by visual analysis and therefore require automated processing algorithms. Pattern recognition in hyperspectral images is particularly complex because of the presence of a considerable number of mixed pixels. This paper discusses the development of data mining and pattern recognition algorithms to handle the complexity of hyperspectral remote sensing images in a Geographical Information Systems environment. Region growing segmentation and radial basis function algorithms are considered powerful tools to minimize the mixed pixel classification error.
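
    As a rough illustration of the classification step only, the sketch below builds a simple radial basis function classifier for pixel spectra (k-means centres plus least-squares output weights); the array shapes, centre count and kernel width are assumptions, not the authors' implementation.

```python
# Minimal RBF-network classifier sketch for per-pixel spectra.
# Shapes, n_centers and gamma are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

def rbf_features(X, centers, gamma):
    """Gaussian activations of each sample against each RBF centre."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def fit_rbf_classifier(X, y, n_centers=20, gamma=0.1):
    """Pick centres with k-means, then fit linear output weights by least squares."""
    centers = KMeans(n_clusters=n_centers, n_init=10).fit(X).cluster_centers_
    Phi = rbf_features(X, centers, gamma)
    Y = np.eye(y.max() + 1)[y]                 # one-hot class targets
    W, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    return centers, W

def predict(X, centers, W, gamma=0.1):
    """Class label with the largest RBF-network output per pixel."""
    return rbf_features(X, centers, gamma).dot(W).argmax(axis=1)

# X: (n_pixels, n_bands) spectra, y: integer class labels per pixel.
```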

    Conceptual development of resources discovery in the proposed hybrid P2P video streaming

    We present the design of a hybrid peer-to-peer (P2P) system for video streaming. In this paper, we address the availability, accessibility and lookup service of files. We use the advantages of the client-server model to search for and retrieve information. We implement a base ontology for the video domain repository so that a keyword search can return more, and more varied, results. To provide dynamic standby peers, we use a checksum value as an indicator to locate identical content in the peer-to-peer network. We hypothesize that, by using client-server searching in a peer-to-peer application, we can reduce lookup latency, path length, peer load and network traffic.
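
    The checksum-based matching of identical content could look roughly like the sketch below; the SHA-256 digest, the peer index layout and the function names are assumptions for illustration, not the paper's protocol.

```python
# Sketch of checksum-based content matching among peers (illustrative only).
import hashlib

def file_checksum(path, chunk_size=1 << 20):
    """SHA-256 digest of a file, read in chunks to keep memory use low."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_standby_peers(target_checksum, peer_index):
    """peer_index maps peer_id -> set of checksums that peer already holds."""
    return [pid for pid, sums in peer_index.items() if target_checksum in sums]

# peer_index = {"peerA": {"ab12..."}, "peerB": {"cd34..."}}
# find_standby_peers(file_checksum("video.mp4"), peer_index)
```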

    Integrated Features by Administering the Support Vector Machine (SVM) of Translational Initiation Sites in Alternative Polymorphic Context

    Many algorithms and methods have been proposed for classification problems in bioinformatics. In this study, a discriminative approach, in particular support vector machines (SVM), is employed to recognize translational initiation site (TIS) patterns in an alternative weak context. The discriminative approach learns discriminant functions from samples labelled as positive or negative; after learning, the discriminant functions are used to decide whether a new sample is true or false. The method has been optimized with the best parameters selected: C = 100, ε = 10^-6 and exponent = 2 for the non-linear kernel function. Results show that with the top 5 features and a non-linear kernel, the best prediction accuracy achieved is 95.8%. The J48 algorithm is applied for comparison with SVM using the top 15 features, and the results show a good prediction accuracy of 95.8%. This indicates that the top 5 features selected by the IGR method and classified with SVM are sufficient to predict TIS in weak contexts.
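
    A hedged sketch of such a pipeline is shown below: a degree-2 polynomial SVM on the top-k selected features, roughly mirroring the reported settings (C = 100, tolerance 1e-6, exponent 2). Mutual information stands in for the paper's IGR feature ranking, and the feature encoding of the sequence windows is assumed.

```python
# Illustrative feature-selection + SVM pipeline (not the authors' exact code).
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

def tis_svm_pipeline(k=5):
    """Top-k feature selection followed by a degree-2 polynomial SVM."""
    return make_pipeline(
        SelectKBest(mutual_info_classif, k=k),   # stand-in for IGR ranking
        SVC(kernel="poly", degree=2, C=100, tol=1e-6),
    )

# X: (n_samples, n_features) encoded sequence windows, y: 1 = true TIS, 0 = false.
# scores = cross_val_score(tis_svm_pipeline(k=5), X, y, cv=10)
```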

    Crystal Method For Accurate Software Duration Estimation

    All projects share one common characteristic: the projection of ideas and activities into new endeavours. The ever-present element of risk and uncertainty means that the events and tasks leading to completion can never be foretold with absolute accuracy. Software projects, moreover, differ from other projects. Underestimation is the root cause of many software projects missing their deadlines or failing outright. Some of the reasons for inaccurate estimation are as follows: traditional models cannot capture the project in enough detail or support quick and reliable strategic analysis; the influence of human factors cannot be incorporated explicitly; the rework phenomenon is not considered; and the dynamic interaction between technical development and management policies is not captured.

    The development and the use of clearing house server infrastructure for GIS interoperability

    The rapid development of internet technology makes more spatial and tabular data publicly available and accessible. However, those data cannot be utilized directly because of differences in data acquisition techniques, data definitions and their semantic meaning. This situation reveals the need for an interoperable GIS to support seamless information sharing. This is the future GIS architecture that may increase the reusability of available spatial datasets and reduce data acquisition costs. The overall framework of the ongoing research has identified four components (modules) to support GIS interoperability. This paper discusses the development and use of the clearinghouse component (module), which enables end users to search, locate and retrieve the required spatial information or metadata from different organizations at different times and places, and to make further enquiries about particular spatial datasets. Detailed discussions of the GIS interoperability model and the architecture of the clearinghouse servers are given. The clearinghouse server is developed using an open-source approach and open file formats; the fully available source code allows modification and customization without licensing restrictions.
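
    To give a flavour of the kind of metadata lookup such a clearinghouse module performs, the sketch below filters catalogue records by keyword and bounding box; the record fields and catalogue structure are assumptions, not the paper's schema.

```python
# Illustrative metadata catalogue search (assumed record fields).
def search_metadata(catalogue, keyword=None, bbox=None):
    """catalogue: list of dicts with 'title', 'organization', 'bbox' = (minx, miny, maxx, maxy)."""
    def overlaps(a, b):
        # Two axis-aligned boxes overlap if they intersect on both axes.
        return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

    hits = []
    for rec in catalogue:
        if keyword and keyword.lower() not in rec["title"].lower():
            continue
        if bbox and not overlaps(rec["bbox"], bbox):
            continue
        hits.append(rec)
    return hits

# records = [{"title": "Road network 2004", "organization": "Survey Dept",
#             "bbox": (101.0, 1.2, 104.5, 6.8)}]
# search_metadata(records, keyword="road", bbox=(100.0, 0.0, 105.0, 7.0))
```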

    Information Security


    An Image Dithering via Tchebichef Moment Transform

    Many image display applications and printing devices allow only a limited number of colours, and have limited computational power and storage for producing high-quality output from high bit-depth colour images. A dithering technique is called for here to improve the perceptual visual quality of limited bit-depth images. A dithered image approximates natural colour within the low bit-depth colour space for display and printing, giving a low-cost way to display and print colour image pixels. This study proposes a dithering technique based on the Tchebichef Moment Transform (TMT) to produce high-quality images at low bit-depth colour. Earlier, a 2×2 Discrete Wavelet Transform (DWT) had been proposed for better image quality in dithering. The 2×2 TMT is chosen here since it performs better than the 2×2 DWT and provides compact support on 2×2 blocks. The results show that the 2×2 TMT gives perceptually better quality in colour image dithering in a significantly more efficient fashion.
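
    For readers unfamiliar with dithering, the sketch below shows a generic 2×2 ordered (Bayer) dither that quantizes a grayscale image to a few levels; it only illustrates the idea of trading bit depth for spatial patterning and does not reproduce the paper's TMT-based method.

```python
# Generic 2x2 ordered-dither sketch (Bayer matrix), illustration only.
import numpy as np

BAYER_2X2 = (np.array([[0, 2],
                       [3, 1]]) + 0.5) / 4.0    # normalised 2x2 threshold map

def ordered_dither(gray, levels=2):
    """Quantise a [0, 1] grayscale image to `levels` using a tiled 2x2 threshold map."""
    h, w = gray.shape
    thresh = np.tile(BAYER_2X2, (h // 2 + 1, w // 2 + 1))[:h, :w]
    q = np.floor(gray * (levels - 1) + thresh)   # threshold before rounding down
    return np.clip(q, 0, levels - 1) / (levels - 1)

# img = np.random.rand(64, 64)
# out = ordered_dither(img, levels=2)   # binary image that preserves tonal impression
```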

    A Generic Psychovisual Error Threshold for the Quantization Table Generation on JPEG Image Compression

    The quantization process is a main part of image compression, controlling the visual quality and the bit rate of the output image. The JPEG quantization tables are obtained from a series of psychovisual experiments that determine a visual threshold. The visual threshold is useful in handling the intensity levels of a colour image that can be perceived visually by the human visual system. This paper investigates a psychovisual error threshold for each DCT frequency on grayscale images. The DCT coefficients are incremented one by one for each frequency order, and the contribution of each DCT coefficient to the reconstruction error gives a primitive psychovisual error. By setting a threshold on this psychovisual error, a new quantization table can be generated. The experimental results show that the new quantization table derived from the psychovisual error threshold for the DCT basis functions gives better image quality at a lower average Huffman code length than standard JPEG image compression.
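
    A hedged sketch of the general idea follows: for each DCT frequency, increase the coefficient amplitude until its reconstruction contribution exceeds a fixed error threshold, and take that amplitude as the quantization step. The threshold value and the simple mean-absolute-error measure are illustrative assumptions, not the paper's psychovisual model.

```python
# Illustrative derivation of a quantization step per DCT frequency from an error threshold.
import numpy as np

def dct_basis(u, v, n=8):
    """Orthonormal 2-D DCT-II basis function of frequency (u, v) on an n x n block."""
    x = np.arange(n)
    cu = np.sqrt(1.0 / n) if u == 0 else np.sqrt(2.0 / n)
    cv = np.sqrt(1.0 / n) if v == 0 else np.sqrt(2.0 / n)
    bx = np.cos((2 * x + 1) * u * np.pi / (2 * n))
    by = np.cos((2 * x + 1) * v * np.pi / (2 * n))
    return cu * cv * np.outer(bx, by)

def quant_table(threshold=2.0, max_step=255, n=8):
    """Smallest amplitude per frequency whose average pixel error exceeds `threshold`."""
    Q = np.full((n, n), max_step, dtype=int)
    for u in range(n):
        for v in range(n):
            basis = dct_basis(u, v, n)
            for amp in range(1, max_step + 1):
                if np.abs(amp * basis).mean() > threshold:   # crude error measure
                    Q[u, v] = amp
                    break
    return Q

# print(quant_table(threshold=2.0))
```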

    CNC PCB drilling machine using novel natural approach to euclidean TSP

    Nowadays, many industries use Computerized Numerical Control (CNC) machines for Printed Circuit Board (PCB) drilling in industrial operations. Finding an optimal tour for a large number of nodes (up to thousands) takes a long time, so an optimization approach needs to be built into the drilling machine to achieve more effective results. The Euclidean Travelling Salesman Problem (TSP) is one optimization formulation that gives a fast, near-optimal solution for the drilling machine's movement using the proposed novel natural approach. This paper describes the development of such a CNC PCB drilling machine with a novel approach to the Euclidean TSP. The design can be widely applied to various CNC PCB drilling machines in small- and medium-scale manufacturing industries.
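
    As a baseline illustration of ordering drill holes as a Euclidean TSP tour, the sketch below uses a plain nearest-neighbour heuristic; it is a generic stand-in, not the paper's novel natural approach.

```python
# Nearest-neighbour tour over drill-hole coordinates (generic TSP heuristic).
import math

def nearest_neighbour_tour(points, start=0):
    """Greedy tour: repeatedly visit the closest unvisited hole."""
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(points, tour):
    """Total Euclidean travel distance of the drill head along the tour."""
    return sum(math.dist(points[tour[i]], points[tour[i + 1]])
               for i in range(len(tour) - 1))

# holes = [(0, 0), (3, 4), (6, 1), (2, 7)]
# t = nearest_neighbour_tour(holes); print(t, tour_length(holes, t))
```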

    Systematic literature review on enhancing recommendation system by eliminating data sparsity

    The aim of this project is to develop an approach using machine learning and matrix factorization to improve recommendation systems. Nowadays, recommendation systems have become an important part of our lives: they make decision making easier and faster by recommending products that match our tastes, and they can be seen everywhere, from online shopping to browsing film catalogues. Unfortunately, such systems still struggle to recommend products that have too few user reviews, because it is then difficult to pinpoint which users would be interested in those products. Research studies have used matrix factorization as the standard solution to this issue, but lately machine learning has emerged as a good alternative for handling data sparsity. This project compares the results of the recommendation system using RMSE to see how each proposed method performs on three different MovieLens datasets. We selected two models, matrix factorization with SVD and a deep-learning-based model, to evaluate these approaches and understand why they are popular solutions to data sparsity. We found that SVD achieved a lower RMSE than deep learning; the reasons behind this are discussed in a later chapter of this thesis. We also identified possible research into exploiting categorical variables in recommendation systems, and that experiment achieved a lower RMSE than both SVD and deep learning, showing the many possible future directions of recommendation system research.
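
    A minimal sketch of the SVD-based baseline and its RMSE evaluation is shown below; the toy rating matrix, the rank and the mean-filling of missing ratings are illustrative assumptions, and a real run would use the MovieLens files mentioned in the abstract.

```python
# Truncated-SVD rating prediction with RMSE on observed entries (toy example).
import numpy as np

def svd_predict(R, mask, k=2):
    """Fill unknown ratings with the global mean, then keep the top-k singular values."""
    filled = np.where(mask, R, R[mask].mean())
    U, s, Vt = np.linalg.svd(filled, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

def rmse(R, pred, mask):
    """Root-mean-square error over the known ratings only."""
    return np.sqrt(np.mean((R[mask] - pred[mask]) ** 2))

R = np.array([[5, 3, 0, 1],          # 0 marks a missing rating
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
mask = R > 0
pred = svd_predict(R, mask, k=2)
print("RMSE on known ratings:", rmse(R, pred, mask))
```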